Low-rank optimization with convex constraints

Authors

  • Christian Grussler
  • Anders Rantzer
  • Pontus Giselsson
Abstract

The problem of low-rank approximation with convex constraints, which often appears in data analysis, image compression and model order reduction, is considered. Given a data matrix, the objective is to find an approximation of desired lower rank that fulfills the convex constraints and minimizes the distance to the data matrix in the Frobenius norm. Matrix completion can be seen as a special case of this problem. Today, one of the most widely used techniques is to approximate this non-convex problem using convex nuclear norm regularization. In many situations, this technique does not give solutions with desirable properties. We instead propose to use the optimal convex minorizer (the closed convex hull) of the Frobenius norm and the rank constraint as a convex proxy. This optimal convex proxy can be combined with other convex constraints to form an optimal convex minorizer of the original non-convex problem. With this approach, we obtain easily verifiable conditions under which the solutions to the convex relaxation and to the original non-convex problem coincide. Several numerical examples are provided for which this is the case. We also see that the proposed convex relaxation consistently performs better than the nuclear norm heuristic, especially in the matrix completion case. Expressibility and computational tractability are of great importance for a convex relaxation. We provide a closed-form expression for the proposed convex approximation and show how to represent it as a semi-definite program. We also show how to compute the proximal operator of the convex approximation, which allows us to use scalable first-order methods to solve large instances of the convex approximation problem.
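
For orientation, the problem described in the abstract can be written out explicitly. The symbols used here (data matrix M, decision variable X, target rank r, convex constraint set \mathcal{C}) are illustrative labels chosen for this sketch rather than notation taken from the paper:

  \min_{X} \; \|M - X\|_F \quad \text{subject to} \quad \operatorname{rank}(X) \le r, \quad X \in \mathcal{C}.

The proposed relaxation replaces the non-convex part with its closed convex hull: writing g(X) = \|M - X\|_F + \chi_{\operatorname{rank}(X) \le r}(X), where \chi denotes the indicator function of the rank constraint, the convex proxy is the biconjugate g^{**}, and the relaxed problem reads

  \min_{X \in \mathcal{C}} \; g^{**}(X).

By construction g^{**} is the largest convex function lying below g, which is the sense in which the abstract calls it the optimal convex minorizer; the easily verifiable conditions mentioned in the abstract identify when a minimizer of this relaxation also solves the original non-convex problem.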


Similar articles

Unifying Low-Rank Models for Visual Learning

Many problems in signal processing, machine learning and computer vision can be solved by learning low-rank models from data. In computer vision, problems such as rigid structure from motion have been formulated as an optimization over subspaces with fixed rank. These hard rank constraints have traditionally been imposed by a factorization that parameterizes subspaces as a product of two matri...
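
For concreteness, the fixed-rank factorization alluded to above is the standard bilinear one (the symbols and dimensions below are generic, not taken from that paper):

  X = B C^{\top}, \qquad B \in \mathbb{R}^{m \times r}, \; C \in \mathbb{R}^{n \times r} \quad \Longrightarrow \quad \operatorname{rank}(X) \le r.

The factorization enforces the rank bound by construction, but it makes the problem non-convex in the factors (B, C), which is the difficulty such fixed-rank formulations have to contend with.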


Low-Rank Inducing Norms with Optimality Interpretations

Optimization problems with rank constraints appear in many diverse fields such as control, machine learning and image analysis. Since the rank constraint is non-convex, these problems are often approximately solved via convex relaxations. Nuclear norm regularization is the prevailing convexifying technique for dealing with these types of problems. This paper introduces a family of low-rank induc...
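
As standard background (not specific to that paper): for a matrix X with singular values \sigma_1(X) \ge \sigma_2(X) \ge \dots, the nuclear norm is

  \|X\|_* = \sum_{i} \sigma_i(X),

and it serves as the usual convex surrogate for \operatorname{rank}(X) because it is the convex envelope of the rank function on the spectral-norm unit ball.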


Approximation of rank function and its application to the nearest low-rank correlation matrix

The rank function rank(·) is neither continuous nor convex, which brings much difficulty to the solution of rank minimization problems. In this paper, we provide a unified framework to construct approximation functions of rank(·), and study their favorable properties. Particularly, with two families of approximation functions, we propose a convex relaxation method for the rank minimization p...


Non-Convex Rank Minimization via an Empirical Bayesian Approach

In many applications that require matrix solutions of minimal rank, the underlying cost function is non-convex, leading to an intractable, NP-hard optimization problem. Consequently, the convex nuclear norm is frequently used as a surrogate penalty term for matrix rank. The problem is that in many practical scenarios there is no longer any guarantee that we can correctly estimate generative low-...


Rank-constrained optimization and its applications

This paper investigates an iterative approach to solve the general rank-constrained optimization problems (RCOPs) defined to optimize a convex objective function subject to a set of convex constraints and rank constraints on unknown rectangular matrices. In addition, rank minimization problems (RMPs) are introduced and equivalently transformed into RCOPs by introducing a quadratic matrix equali...



Journal:
  • CoRR

Volume: abs/1606.01793, Issue: -

Pages: -

Publication date: 2016